Code-Reuse Attack Detection Using Kullback-Leibler Divergence in IoT

Authors

Abstract


Similar Resources

Rényi Divergence and Kullback-Leibler Divergence

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...

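For quick reference (these are the standard discrete-case definitions, not text from the cited abstract), the Rényi divergence of order α and its order-1 limit can be written as:

```latex
% R\'enyi divergence of order \alpha (\alpha > 0, \alpha \neq 1) between
% discrete distributions P = (p_i) and Q = (q_i)
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1}\,\log \sum_i p_i^{\alpha} q_i^{1-\alpha}

% Letting \alpha \to 1 recovers the Kullback-Leibler divergence:
\lim_{\alpha \to 1} D_\alpha(P \,\|\, Q) = \sum_i p_i \log \frac{p_i}{q_i} = D_{\mathrm{KL}}(P \,\|\, Q)
```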

Kullback-Leibler Divergence Based Detection of Repackaged Android Malware

Android applications are widely used by millions of users to perform many activities. Unfortunately, legitimate and popular applications are targeted by malware authors, who repackage existing applications by injecting additional code intended to perform malicious activities without the knowledge of end users. Thus, it is important to validate applications for possible repackaging befor...

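The truncated abstract does not say which application features are compared, so the following is only a minimal sketch of the general idea behind KLD-based repackaging detection: build a feature-frequency histogram for the original app and a suspect copy (hypothetical opcode counts here), and flag the suspect when the divergence between the two distributions exceeds a threshold. The histogram contents and the threshold value are illustrative assumptions, not the paper's method.

```python
import math
from collections import Counter

def kl_divergence(p_counts, q_counts, smoothing=1e-6):
    """Kullback-Leibler divergence D(P || Q) between two count histograms.

    A small smoothing constant keeps every q_i > 0 so the divergence is finite.
    """
    keys = set(p_counts) | set(q_counts)
    p_total = sum(p_counts.values()) + smoothing * len(keys)
    q_total = sum(q_counts.values()) + smoothing * len(keys)
    div = 0.0
    for k in keys:
        p = (p_counts.get(k, 0) + smoothing) / p_total
        q = (q_counts.get(k, 0) + smoothing) / q_total
        div += p * math.log(p / q)
    return div

# Hypothetical opcode histograms for an original app and a suspect repackage.
original = Counter({"invoke-virtual": 500, "move": 300, "const-string": 120})
suspect  = Counter({"invoke-virtual": 480, "move": 290, "const-string": 115,
                    "invoke-static": 90})   # extra calls from injected code

THRESHOLD = 0.05   # illustrative cut-off; would need tuning on real data
if kl_divergence(suspect, original) > THRESHOLD:
    print("possible repackaging: opcode distribution deviates from the original")
```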

Server-side code injection attack detection based on Kullback-Leibler distance

In this paper, we apply a well-known measure from the information theory domain, the Kullback-Leibler distance (or divergence, KLD), to detect the symptoms of code injection attacks early during program runtime. We take advantage of the observation that during a code injection attack, the intended structure deviates from the expected structure. The KLD can be a suitable measure to capture the dev...

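The truncated abstract only states that, under a code injection attack, the observed structure deviates from the expected one and that KLD can capture this deviation. As a hedged illustration of that idea only (the token classes, frequencies and threshold below are invented for the example, not taken from the paper), one could compare an expected structural profile with the profile seen at runtime:

```python
import numpy as np

# Hypothetical structural profile: relative frequencies of token classes
# (keywords, identifiers, literals, operators) in the output a server-side
# script is expected to produce, learned from attack-free runs.
expected = np.array([0.40, 0.35, 0.15, 0.10])

# Profile observed at runtime; injected code typically shifts this mix,
# e.g. extra keywords and operators coming from an embedded payload.
observed = np.array([0.28, 0.30, 0.20, 0.22])

# D(observed || expected) = sum_i o_i * log(o_i / e_i)
kld = float(np.sum(observed * np.log(observed / expected)))

THRESHOLD = 0.05   # illustrative cut-off; would be calibrated on benign traffic
verdict = "possible code injection" if kld > THRESHOLD else "looks normal"
print(f"KLD = {kld:.4f} -> {verdict}")
```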

Use of Kullback–Leibler divergence for forgetting

The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, us...

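Because the order of the arguments matters in this setting, it may help to recall the standard definition (stated here for context, not quoted from the abstract): the KLD is not symmetric, so the two orderings generally give different values.

```latex
% Kullback--Leibler divergence of pdf f from pdf g
D(f \,\|\, g) = \int f(x)\,\log \frac{f(x)}{g(x)}\,dx

% The divergence is not symmetric in its arguments:
D(f \,\|\, g) \neq D(g \,\|\, f) \quad \text{in general}
```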

Vector Quantization by Minimizing Kullback-Leibler Divergence

This paper proposes a new method for vector quantization that minimizes the Kullback-Leibler divergence between the class-label distributions over the quantization inputs, which are the original vectors, and over the output, which consists of the quantization subsets of the vector set. In this way, the vector quantization output keeps as much class-label information as possible. An objective function is...

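The truncated abstract does not give the exact objective, so the sketch below is only one plausible reading of it: score a candidate quantization by the size-weighted KLD between each cell's label distribution and the global label distribution. This quantity equals the mutual information between the cell assignment and the class label, so maximizing it keeps as much class-label information as possible. The function name label_information and the toy data are illustrative, not from the paper.

```python
import numpy as np

def label_information(assignments, labels, n_cells, n_classes, eps=1e-12):
    """Mutual information I(cell; label), written as an expected KL divergence.

    For each quantization cell, take the empirical label distribution of the
    vectors assigned to it and measure its KLD from the global label
    distribution; weighting by cell size gives I(cell; label), i.e. how much
    class-label information the quantization output retains.
    """
    global_p = np.bincount(labels, minlength=n_classes) / len(labels)
    info = 0.0
    for c in range(n_cells):
        members = labels[assignments == c]
        if len(members) == 0:
            continue
        cell_p = np.bincount(members, minlength=n_classes) / len(members)
        weight = len(members) / len(labels)
        info += weight * np.sum(cell_p * np.log((cell_p + eps) / (global_p + eps)))
    return float(info)

# Toy example: six labelled vectors quantized into two cells.
labels      = np.array([0, 0, 1, 1, 1, 0])
assignments = np.array([0, 0, 1, 1, 1, 0])   # a label-preserving partition
print(label_information(assignments, labels, n_cells=2, n_classes=2))
```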


Journal

Journal title: International Journal of Advanced Smart Convergence

Year: 2016

ISSN: 2288-2847

DOI: 10.7236/ijasc.2016.5.4.54